Rights Group Urges Social Media to Save Evidence of War Crimes
2020-09-23
A rights group has called on social media companies to save images and video that could be used as evidence to investigate serious crimes.
Companies like Facebook, YouTube and Twitter are removing material thought to be offensive or illegal, Human Rights Watch says in a new report.
It noted that such material could be presented as evidence at trials for persons accused of war crimes or other charges.
The group said it understands why social media companies see the need to remove some material, including content that supports or incites violence.
But it is urging the companies to collect and save such material for possible use in criminal cases.
In addition to using human monitors, social media services are increasingly turning to machine learning methods to remove content that violates their policies.
Human rights researchers fear that the use of machines, without human intervention, could lead to important evidence being lost or destroyed.
Belkis Wille is a Human Rights Watch researcher who helped prepare the new report.
She told VOA that videos and images from social media are often used in her group's investigations.
"What we started to notice in the last few years, particularly since 2017, is that we would see a video of let's say soldiers executing someone, or an (Islamic State) propaganda video.
If 15 minutes or an hour later we went back to look at a video again, it was suddenly gone," Wille said.
The report noted video evidence collected from social media by the international investigative group Bellingcat.
These videos showed a Russian "Buk" ground-to-air missile launcher.
International investigators say the device was used to fire the weapon that brought down Malaysia Airlines Flight MH17.
The flight crashed in eastern Ukraine on July 17, 2014.
All 298 people on the airplane were killed.
The Dutch-led Joint Investigation Team later presented the videos as evidence.
Russia has denied involvement in the incident.
Wille said that during Bellingcat's investigation, the group went to look for evidence it had found earlier on social media, but the material had been removed.
Governments are putting increasing pressure on online companies to remove offensive, illegal or dangerous material from the internet.
Social media companies promised to do more to block extremist content after the live-streaming on Facebook of a terror attack on two Islamic centers in New Zealand in 2019.
Fifty-one people died in the attack.
Social media companies have told Human Rights Watch they are required by law to remove material that could be offensive or incite terror, violence or hatred.
Wille told VOA that the removal systems are now so effective "that they are taking down content the minute it gets posted. So, no user actually gets to see that content before it comes down."
Syrian Archive is using social media videos to document possible war crimes, including the use of chemical weapons.
The rights group has also raised concerns about important evidence being removed from social media before it can be saved and examined.
Wille said that one solution could be the creation of an international registry or archive system for collecting online images and video.
The people responsible for this record-keeping system would decide who should get access to the content, she said.
The collected material would be kept for "investigative purposes."
Human Rights Watch says it is in talks with social media companies about creating such an archive.
I'm Bryan Lynn.
Henry Ridgwell reported this story for VOA News. Bryan Lynn adapted the report for Learning English. George Grow was the editor.

_______________________________________________________________

Words in This Story

content - n. information contained in a piece of writing, a speech, a movie or on the internet

monitor - n. a person whose job it is to watch or notice particular things

notice - v. to see or become aware of something

live-streaming - n. broadcasting content live on the internet

post - v. to publish a piece of information on the internet

access - n. the right or opportunity to use or see something